The Good Robot Podcast: Featuring Shannon Vallor

AIHub

Hosted by Eleanor Drage and Kerry Mackereth, The Good Robot is a podcast which explores the many complex intersections between gender, feminism and technology. In this episode we chat to Shannon Vallor, the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the University of Edinburgh and the Director of the Centre for Technomoral Futures. We talk about feminist care ethics; technologies, vices and virtues; why Aristotle believed that the people who make technology should be excluded from citizenship; and why we still don't have the kinds of robots that we imagined we'd have in the early 2000s. We also discuss Shannon's new book, The AI Mirror, which is now available for pre-order. Professor Shannon Vallor holds the Baillie Gifford Chair at the Edinburgh Futures Institute (EFI) at the University of Edinburgh, where she is also appointed in Philosophy.


"That Wasn't My Intent": Reenvisioning Ethics in the Information Age

#artificialintelligence

WENDELL WALLACH: It gives me great pleasure to welcome my longtime colleague Shannon Vallor to this Artificial Intelligence & Equality podcast. Shannon and I have both expressed concerns that ethics and ethical philosophy are inadequate for addressing the issues posed by artificial intelligence (AI) and other emerging technologies, so I have been looking forward to our having a conversation about why that is the case and about ideas for reenvisioning ethics and empowering it for the information age. Before we get to that conversation, let me introduce Shannon to our listeners, provide a very cursory overview of how ethical theories are understood within academic circles, and give Shannon the opportunity to introduce you to the research and insights for which she is best known.

Again, before turning to Shannon, let me make sure that listeners have at least a cursory understanding of the field of ethics. Ethical theories are often said to fall into two big tents. In one of those tents, the determination of what is right, good, or just derives from following the rules or doing your duty. Often these rules are captured in high-level principles: they can be the Ten Commandments or the four principles of biomedical ethics. In India they might be Yama and Niyama. Each culture has its own set of rules. Even Asimov's "Three Laws of Robotics" count as rules meant to direct the behavior of robots. All of these theories are said to be deontological, a term going back to the Greek word for duty, and the basic claim is that rules and duties define ethics--but of course there are outstanding questions about whose rules to follow, what to do when rules conflict, and how to deal with situations in which people prioritize the rules very differently. At the end of the 18th and the beginning of the 19th centuries, Jeremy Bentham, a British philosopher, came up with a totally different approach to ethics, which is sometimes called utilitarianism or consequentialism.